On Trolleys, Self-Driving Cars, and Missing the Forest for the Trees.
I felt that self-driving cars raise several ethical dilemmas around liability and moral responsibility. Although these questions don't have answers yet, given the financial might of the companies that will end up building and operating self-driving cars, the answers that do get drawn up will almost certainly not favor the general public.

Consider an example: a person is injured by a self-driving car due to a misclassification error by an embedded computer vision model that usually achieves a 99.99% accuracy rate. The autonomous car companies will have lobbied to have that type of event classified as a freak statistical occurrence, so as to avoid liability for the ensuing damages and injuries. After all, in today's world no hardware manufacturer is expected to achieve 100% reliability with its products, and a convolutional neural network is just another technological artifact. On the other hand, a human driver who has never run a red light in 30 years because he has never once mistaken it for green, but who today misclassifies the color of the crossing signal for the first time and hurts someone in the process, won't be able to claim that his red-light classification accuracy has been 99.9995% so far and that this was just a freak statistical occurrence.
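To see why "freak statistical occurrence" is a misleading frame at scale, here is a back-of-the-envelope sketch. The frame rate is a purely illustrative assumption (the essay doesn't specify one); only the 99.99% accuracy figure comes from the text above.

```python
# Back-of-the-envelope: how "freak" is an error from a 99.99%-accurate model?
# The frame rate below is an illustrative assumption, not a measured figure.

accuracy = 0.9999              # per-classification accuracy from the example above
error_rate = 1 - accuracy

frames_per_second = 10         # assumed rate at which the vision model classifies frames
seconds_per_hour = 3600

classifications_per_hour = frames_per_second * seconds_per_hour
expected_errors_per_hour = classifications_per_hour * error_rate

print(f"Classifications per hour of driving: {classifications_per_hour}")
print(f"Expected misclassifications per hour: {expected_errors_per_hour:.1f}")
```

Under these assumptions a single car would be expected to misclassify a few frames every hour of driving; multiplied across a fleet of millions of vehicles, such errors become routine rather than freakish, which is exactly what the liability framing obscures.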